Geometric Optimization, April 12, 2007, Lecture 25: Johnson-Lindenstrauss Lemma
Abstract
The topic of this lecture is dimensionality reduction. Many problems can be solved efficiently in low dimensions, but solutions designed for low-dimensional spaces are often impractical in high-dimensional spaces because their space or running time is exponential in the dimension. To address this curse of dimensionality, one technique is to map a set of points in a high-dimensional space to a set of points in a low-dimensional space while preserving the important characteristics of the data set, such as pairwise distances. In this lecture we study the Johnson-Lindenstrauss Lemma; essentially all dimension reduction techniques based on random projection rely on it.
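As a minimal illustration (not taken from the lecture itself), the sketch below reduces dimension with a Gaussian random projection, the construction most commonly used to prove the Johnson-Lindenstrauss Lemma. The target dimension k = O(log n / ε²), the constant 8 in that bound, and the function name random_projection are illustrative assumptions, not values specified in the lecture.

```python
# A sketch of dimensionality reduction by Gaussian random projection,
# in the spirit of the Johnson-Lindenstrauss Lemma.
import numpy as np

def random_projection(points, eps=0.5, rng=None):
    """Project n points from R^d down to k = O(log n / eps^2) dimensions."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = points.shape
    k = int(np.ceil(8 * np.log(n) / eps**2))  # illustrative constant, not from the lecture
    # Entries drawn i.i.d. from N(0, 1/k), so squared norms are preserved in expectation.
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    return points @ R

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 10_000))   # 100 points in 10,000 dimensions
    Y = random_projection(X, eps=0.5, rng=rng)
    # Check the distortion of one pairwise distance.
    orig = np.linalg.norm(X[0] - X[1])
    proj = np.linalg.norm(Y[0] - Y[1])
    print(Y.shape, proj / orig)          # ratio should be close to 1
```

With high probability every pairwise distance is distorted by at most a factor of 1 ± ε, which is why the projection can safely replace the original points in distance-based computations.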
Similar resources
236779: Foundations of Algorithms for Massive Datasets, Lecture 4: The Johnson-Lindenstrauss Lemma
The Johnson-Lindenstrauss lemma and its proof. This lecture aims to prove the Johnson-Lindenstrauss lemma. Since the lemma is proved easily with another interesting lemma, part of this lecture is devoted to the proof of that second lemma. At the end, the optimality of the Johnson-Lindenstrauss lemma is discussed. Lemma 1 (Johnson-Lindenstrauss). Given the initial space X ⊆ R^n s.t. |X| = N, ...
Applications of the Gaussian Min-Max theorem
We show how the Gaussian min-max theorem provides direct proofs of several famous results in asymptotic geometric analysis, such as the Dvoretzky theorem, the Johnson-Lindenstrauss Lemma, Gluskin's theorem on embedding in ℓ1, and others.
Lecture 6: Johnson-Lindenstrauss Lemma: Dimension Reduction
Observe that for any three points, if the three distances between them are given, then the three angles are fixed. Given n−1 vectors, the vectors together with the origin form a set of n points. In fact, given any n points in Euclidean space (in n−1 dimensions), the Johnson-Lindenstrauss Lemma states that the n points can be placed in O(log n / ε²) dimensions such that distances are preserved wi...
Lecture 5, September 8, 2016
1 Overview. In the last lecture we took a more in-depth look at Chernoff bounds and introduced subgaussian and subexponential variables. In this lecture we continue discussing subgaussian variables and related random variables, namely subexponential and subgamma variables, and finally we give a proof of the famous Johnson-Lindenstrauss lemma using properties of subgaussian/subgamma variables. Definition ...
Using the Johnson-Lindenstrauss lemma in linear and integer programming
The Johnson-Lindenstrauss lemma allows dimension reduction on real vectors with low distortion of their pairwise Euclidean distances. This result is often used in algorithms such as k-means or k-nearest neighbours, since they only use Euclidean distances, and it has sometimes been used in optimization algorithms involving the minimization of Euclidean distances. In this paper we introduce a first a...